CS589: Machine Learning

Course Description: This course will introduce core machine learning models and algorithms for classification, regression, clustering, and dimensionality reduction. On the theory side, the course will cover the mathematical foundations underlying the most commonly-used machine learning algorithms. It will focus on understanding models and the relationships between them. On the applied side, the course will focus on effectively using machine learning methods to solve real-world problems with an emphasis on model selection, regularization, design of experiments, and presentation and interpretation of results. The course will have assignments that involve both mathematical problems and implementation tasks. Broad topics covered in this course will include classification algorithms in general, decision trees, random forests, probabilistic models, Naive Bayes methods, various ensemble meta-algorithms (such as bagging and boosting), gradient-based techniques, linear regression, logistic regression, neural networks, convolutional neural networks and deep learning, unsupervised learning and clustering algorithms, k-means, hierarchical clustering, and dimensionality reduction techniques.
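To give a quick taste of one of those topics, here is a minimal k-means clustering sketch in NumPy. This is my own illustrative code written for this page, not material from the course:

import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Cluster the rows of X into k groups by alternating assignment and update steps."""
    rng = np.random.default_rng(seed)
    # Pick k distinct data points as the initial centroids.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assign each point to its nearest centroid (Euclidean distance).
        distances = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = distances.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned points.
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels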


My Course Reflection

I took this course in Spring 2023. My God, this course was a great introduction to the core machine learning models. We learned and implemented KNN, Decision Trees, Random Forests, Naive Bayes, Neural Networks, and more. The homework was time-consuming, but it was rewarding to see how much I learned from the course. The course wasn't difficult at all, but you must pay close attention to the details and the slides. Building everything from scratch was the most rewarding part.
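For example, the k-nearest-neighbors classifier we built from scratch boils down to just a few lines. Here is a simplified sketch from memory, not the exact homework code:

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_query, k=5):
    """Predict the label of x_query by majority vote among its k nearest training points."""
    # Euclidean distance from the query point to every training point.
    distances = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest training points.
    nearest = np.argsort(distances)[:k]
    # Majority vote over their labels.
    return Counter(y_train[nearest]).most_common(1)[0][0]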

Code Demonstration for Forward and Backward Propagation in a Neural Network

import numpy as np

def _forward(self, instance_attributes=None, instance_class=None):
    activation = []
    activation_layer = instance_attributes

    # Prepend the bias term to the input layer.
    activation_layer = np.insert(activation_layer, 0, 1)
    activation.append(activation_layer)

    # Propagate through each hidden layer.
    for layer in range(1, len(self.network_structure) - 1):
        layer_weight = self.weights_matrix[layer - 1]
        sigmoid_vector = layer_weight @ activation_layer      # pre-activation values (z)
        activation_layer = np.array([self._sigmoid(z) for z in sigmoid_vector])
        activation_layer = np.insert(activation_layer, 0, 1)  # bias for the next layer
        activation.append(activation_layer)

    # Output layer: no bias term is appended to the predictions.
    sigmoid_vector = self.weights_matrix[-1] @ activation_layer
    activation_layer = np.array([self._sigmoid(z) for z in sigmoid_vector])
    activation.append(activation_layer)
    prediction_vector = activation_layer
    return activation, prediction_vector

def _backward(self, delta: list, activation: list, gradients: list, training_index: int):
    # Propagate the output-layer error back through the hidden layers.
    delta = self._computeDeltaHidden(delta=delta, activation=activation)
    # Accumulate this instance's gradients (before regularization is applied).
    gradients_b4_regularized = self._updateGradients(delta=delta, activation=activation,
                                                     gradients=gradients, instance_num=training_index)
    return gradients_b4_regularized
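The two methods above rely on a sigmoid helper and sit inside a larger training loop. Below is a minimal sketch of those surrounding pieces; the loop is a simplified outline of how I remember wiring things together, not the exact assignment code:

import numpy as np

def sigmoid(z):
    """Logistic activation used at every layer: squashes z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(a):
    """Derivative of the sigmoid, written in terms of its output a = sigmoid(z)."""
    return a * (1.0 - a)

# Rough shape of one training epoch (network is an instance of the class above,
# and gradients starts as zero matrices shaped like network.weights_matrix):
# for i, (x, y) in enumerate(zip(X_train, y_train)):
#     activation, prediction = network._forward(instance_attributes=x, instance_class=y)
#     delta = prediction - y      # output-layer error for sigmoid + cross-entropy
#     gradients = network._backward(delta=delta, activation=activation,
#                                   gradients=gradients, training_index=i)
# The accumulated gradients are then regularized and used in a gradient-descent weight update.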